Lightweight attention mechanism module based on squeeze and excitation
Zhenhu LYU, Xinzheng XU, Fangyan ZHANG
Journal of Computer Applications, 2022, 42(8): 2353-2360. DOI: 10.11772/j.issn.1001-9081.2021061037
Abstract

Embedding attention mechanism modules into a Convolutional Neural Network (CNN) improves accuracy, but it also increases the number of parameters and the computational cost. To address this issue, two lightweight modules based on squeeze and excitation were proposed: the Height Dimensional Squeeze and Excitation (HD-SE) module and the Width Dimensional Squeeze and Excitation (WD-SE) module. To make full use of the latent information in the feature maps, HD-SE and WD-SE extracted weight information along the height and width dimensions of the feature maps respectively through squeeze and excitation operations, and then applied the obtained weights to the corresponding tensors of the two dimensions to improve model accuracy. Experiments were conducted on the CIFAR10 and CIFAR100 datasets after embedding HD-SE and WD-SE into the Visual Geometry Group 16 (VGG16), Residual Network 56 (ResNet56), MobileNetV1 and MobileNetV2 models respectively. Experimental results show that, compared with state-of-the-art attention mechanism modules such as the Squeeze and Excitation (SE) module, the Coordinate Attention (CA) block, the Convolutional Block Attention Module (CBAM) and the Efficient Channel Attention (ECA) module, HD-SE and WD-SE add fewer parameters and less computational cost while the models achieve the same or even better accuracy.
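To illustrate the general idea, the following is a minimal PyTorch-style sketch of a squeeze-and-excitation block applied along the height dimension, in the spirit of HD-SE. It is an assumption-laden reading of the abstract, not the authors' exact design: the class name, the use of mean pooling for the squeeze, and the bottleneck reduction ratio are all illustrative choices.

```python
import torch
import torch.nn as nn

class HDSE(nn.Module):
    """Sketch of a height-dimensional squeeze-and-excitation block.

    Squeezes the feature map over the channel and width dimensions into a
    per-row descriptor, excites it with a small bottleneck MLP, and rescales
    each row of the input. Hypothetical design; details may differ from the
    paper's HD-SE module.
    """
    def __init__(self, height: int, reduction: int = 4):
        super().__init__()
        hidden = max(height // reduction, 1)
        self.fc = nn.Sequential(
            nn.Linear(height, hidden),
            nn.ReLU(inplace=True),
            nn.Linear(hidden, height),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (N, C, H, W); squeeze over channels and width -> (N, H)
        s = x.mean(dim=(1, 3))
        # excitation: one weight in (0, 1) per height position -> (N, H)
        w = self.fc(s)
        # broadcast the row weights over channels and width and rescale
        return x * w.view(x.size(0), 1, -1, 1)
```

A WD-SE counterpart would be symmetric: squeeze over channels and height with `x.mean(dim=(1, 2))` and reshape the weights to (N, 1, 1, W) so each column of the feature map is rescaled instead. Because the excitation operates on a single spatial axis rather than the full channel dimension, such a block adds far fewer parameters than a standard channel-wise SE module.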
